Cache the set of active transfers in receive arbiter #319
Conversation
Check-perf-impact results: (000e2892abb21ddd1ae413a5c86d7d95) ❓ No new benchmark data submitted. ❓
Pull Request Test Coverage Report for Build 12789562762
💛 - Coveralls
Good stuff!
Nice
Force-pushed from 65604a0 to 03efe77
Force-pushed from 03efe77 to 2218430
Check-perf-impact results: (c8fb992b35322012b54e351345fdf71a) ❓ No new benchmark data submitted. ❓
Check-perf-impact results: (deac89ab3c981780086a8f2d08cf6170) 🚀 Significant speedup (<0.80x) in some microbenchmark results: "intrusive graph dependency handling with N nodes - 100 / creating nodes". Relative execution time per category: (mean of relative medians)
Force-pushed from 5849f0d to 5afe621
`receive_arbiter` tracks a `transfer` for every transfer id it learns through an inbound pilot, so long-running programs like `wave_sim -T 10000` will fill the transfer map quickly. It's an `unordered_map`, so lookups shouldn't suffer, but we iterate over it in `poll_communicator`, which has linear complexity (initially found by @psalz). We only need to poll transfers for which we have already seen the `receive_instruction`. This PR avoids the linear growth by caching the set of active transfers in a vector and iterating over that.

I've used the opportunity to refactor the code a bit and use `std::erase_if` in the many places where we previously had the erase-remove idiom.
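A minimal sketch of the pattern described above, not the actual Celerity implementation: the map of all known transfers keeps growing as inbound pilots arrive, while a separate vector caches only the transfers whose `receive_instruction` has already been seen, and `poll_communicator` iterates over that vector. The member and function names here (`begin_receive`, `m_active_transfers`, `transfer::poll`) are assumptions for illustration.

```cpp
#include <unordered_map>
#include <vector>

// Hypothetical, simplified stand-ins for the real receive_arbiter state.
using transfer_id = int;

struct transfer {
	bool received_instruction = false; // set once the receive_instruction was seen
	bool poll() { return /* ...make progress..., true when complete */ true; }
};

class receive_arbiter {
  public:
	// Called once the receive_instruction for `trid` is known: from now on the
	// transfer must be polled, so add it to the active-transfer cache.
	void begin_receive(const transfer_id trid) {
		transfer& tr = m_transfers[trid]; // may already exist from an inbound pilot
		tr.received_instruction = true;
		m_active_transfers.push_back(&tr);
	}

	// Iterate only over the cached active transfers instead of the entire
	// (ever-growing) transfer map, dropping completed ones via std::erase_if.
	void poll_communicator() {
		std::erase_if(m_active_transfers, [](transfer* const tr) { return tr->poll(); });
	}

  private:
	// Grows with every transfer id learned from an inbound pilot.
	std::unordered_map<transfer_id, transfer> m_transfers;
	// Only the transfers whose receive_instruction has been seen. Pointers into
	// an unordered_map remain valid across insertions, so raw pointers are safe.
	std::vector<transfer*> m_active_transfers;
};
```

Since `std::unordered_map` guarantees pointer stability for its elements, the cache can hold raw pointers into the map; completed transfers are pruned from the cache with `std::erase_if` (C++20), the same utility the PR uses to replace the erase-remove idiom elsewhere.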